Chernoff-Hoeffding Inequality
Abstract
When dealing with modern big data sets, a very common theme is reducing the set through a random process. These generally work by making “many simple estimates” of the full data set, and then judging them as a whole. Perhaps magically, these “many simple estimates” can provide a very accurate and small representation of the large data set. The key tool in showing how many of these simple estimates are needed for a fixed accuracy trade-off is the Chernoff-Hoeffding inequality [2, 5]. This document provides a simple form of this bound, and two examples of its use.
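For reference, one common simple form of the bound (it matches the lemma quoted verbatim in the similar-article excerpt further below) reads as follows: if $X_1, \dots, X_n$ are independent random variables taking values in $[0, 1]$ with $\mathbb{E}[X_i] = \mu$ and $S_n = X_1 + \cdots + X_n$, then for all $t \ge 0$,

$$\Pr[S_n \ge n\mu + t] \le e^{-2t^2/n} \quad\text{and}\quad \Pr[S_n \le n\mu - t] \le e^{-2t^2/n}.$$

The exact statement in the document itself may differ slightly, for example in how the deviation is parameterized.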
Similar articles
Chernoff-Hoeffding Inequality and Applications
Full text
Chernoff-Hoeffding Inequality
Full text
Lecture 03: Chernoff Bounds and Intro to Spectral Graph Theory, 3.1.1 Hoeffding's Inequality
Full text
Material for "Combinatorial multi-armed bandit"
We use the following two well known bounds in our proofs. Lemma 1 (Chernoff-Hoeffding bound). Let $X_1, \dots, X_n$ be random variables with common support $[0, 1]$ and $\mathbb{E}[X_i] = \mu$. Let $S_n = X_1 + \cdots + X_n$. Then for all $t \ge 0$, $\Pr[S_n \ge n\mu + t] \le e^{-2t^2/n}$ and $\Pr[S_n \le n\mu - t] \le e^{-2t^2/n}$. Lemma 2 (Bernstein inequality). Let $X_1, \dots, X_n$ be independent zero-mean random variables. If for all $1 \le i \le n$, $|X_i| \le k$...
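As a quick illustration of Lemma 1 above, here is a minimal Monte Carlo sketch (not taken from the quoted material) that estimates the upper tail of a sum of Bernoulli draws and compares it with the $e^{-2t^2/n}$ bound; the Bernoulli choice, the helper name empirical_tail, and the parameters n = 100, mu = 0.5, t = 10 are illustrative assumptions.

import math
import random

def empirical_tail(n, mu, t, trials=50_000):
    """Estimate Pr[S_n >= n*mu + t], where S_n is a sum of n Bernoulli(mu) draws."""
    hits = 0
    for _ in range(trials):
        s = sum(1 for _ in range(n) if random.random() < mu)  # one realization of S_n
        if s >= n * mu + t:
            hits += 1
    return hits / trials

n, mu, t = 100, 0.5, 10                 # illustrative parameters only
estimate = empirical_tail(n, mu, t)
bound = math.exp(-2 * t * t / n)        # e^{-2 t^2 / n}, the bound in Lemma 1
print(f"empirical tail ~= {estimate:.4f}; Chernoff-Hoeffding bound = {bound:.4f}")

With these values the bound evaluates to $e^{-2} \approx 0.135$, while the empirical tail should come out noticeably smaller, consistent with the inequality being an upper bound rather than a tight estimate.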
Full text
Further Optimal Regret Bounds for Thompson Sampling
The second-last inequality follows from the observation that the event $E_i(t)$ was defined as $\hat{\mu}_i(t) > x_i$. At time $\tau_k + 1$ for $k \ge 1$, $\hat{\mu}_i(\tau_k+1) = \frac{S_i(\tau_k+1)}{k+1} \le \frac{S_i(\tau_k+1)}{k}$, where the latter is simply the average of the outcomes observed from $k$ i.i.d. plays of arm $i$, each of which is a Bernoulli trial with mean $\mu_i$. Using Chernoff-Hoeffding bounds (Fact 1), we obtain that $\Pr(\hat{\mu}_i(\tau_k + 1) > x_i) \le \Pr\bigl(\frac{S_i(\tau_k+1)}{k} > x_i\bigr) \le$ ...
Full text
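For context on the last excerpt: the Chernoff-Hoeffding step quoted there bounds the tail of $\frac{S_i(\tau_k+1)}{k}$, the average of $k$ i.i.d. Bernoulli($\mu_i$) outcomes. One way such a step can be instantiated (the excerpt's Fact 1 may state a different or tighter form) is the additive bound, i.e., Lemma 1 above applied with $n = k$ and $t = k(x_i - \mu_i)$: for $x_i > \mu_i$,

$$\Pr\!\left(\frac{S_i(\tau_k+1)}{k} > x_i\right) \le e^{-2k(x_i - \mu_i)^2}.$$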